
    A note on the economic cost of climate change and the rationale to limit it below 2°C

    This note highlights a major reason to limit climate change to the lowest possible levels. This reason follows from the large increase in uncertainty associated with high levels of warming. This uncertainty arises from three sources: the change in climate itself, the change's impacts at the sector level, and their macroeconomic costs. First, the greater the difference between the future climate and the current one, the more difficult it is to predict how local climates will evolve, making it more difficult to anticipate adaptation actions. Second, the adaptive capacity of various economic sectors can already be observed for limited warming, but is largely unknown for larger changes. The larger the change in climate, therefore, the more uncertain is the final impact on economic sectors. Third, economic systems can efficiently cope with sectoral losses, but macroeconomic-level adaptive capacity is difficult to assess, especially when it involves more than marginal economic changes and when structural economic shifts are required. In particular, these shifts are difficult to model and involve thresholds beyond which the total macroeconomic cost would rise rapidly. The existence of such thresholds is supported by past experiences, including economic disruptions caused by natural disasters, observed difficulties funding needed infrastructure, and regional crises due to rapid economic shifts induced by new technologies or globalization. As a consequence, larger warming is associated with higher cost, but also with larger uncertainty about the cost. Because this uncertainty translates into risks and makes it more difficult to implement adaptation strategies, it represents an additional motive to mitigate climate change.

    Keywords: Climate Change Economics; Science of Climate Change; Climate Change Mitigation and Green House Gases; Adaptation to Climate Change; Transport Economics Policy & Planning
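The note's core claim, that larger warming brings both a higher expected cost and a wider spread around it, can be illustrated with a stylized Monte Carlo sketch (the convex damage form, the lognormal spread, and all numbers below are our illustrative assumptions, not taken from the note):

```python
import numpy as np

# Stylized illustration: an uncertain multiplicative response (theta)
# combined with convex damages makes both the mean cost and its
# standard deviation grow faster than linearly in warming delta_t.
rng = np.random.default_rng(42)
theta = rng.lognormal(mean=0.0, sigma=0.4, size=100_000)  # combined uncertainty

def cost_draws(delta_t):
    # damages convex in realized warming theta * delta_t (assumed form)
    return 0.5 * (theta * delta_t) ** 2

for dT in (1.0, 2.0, 4.0):
    draws = cost_draws(dT)
    print(dT, round(float(draws.mean()), 3), round(float(draws.std()), 3))
```

Because the uncertain factor enters multiplicatively and damages are convex, doubling warming quadruples both the mean and the spread here: higher warming is costlier and, at the same time, harder to plan for.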

    Beyond the Stern Review: Lessons from a risky venture at the limits of the cost–benefit analysis

    This paper argues that the debates amongst economists triggered by the Stern Review are partly relevant, focusing on key parameters that translate real ethical issues, and partly misplaced in that they pay too little attention to other determinants of climate change damages: i) the specification of the utility function used for the assessments (preference for the environment, preference for smooth growth paths); ii) the interplay between uncertainty and the sequentiality of decisions; and iii) whether the growth engines behind integrated assessment models can account for transient disequilibrium and sub-optimality. We derive suggestions for any future research agenda in integrated assessment modelling, whatever the analyst's position on the relevance of the intertemporal optimisation framework and the Bayesian approach to uncertainty in climate affairs.

    Optimal control models and elicitation of attitudes towards climate damages

    This paper examines the consequences of various attitudes towards climate damages through a family of stochastic optimal control models (RESPONSE): cost-efficiency for a given temperature ceiling; cost-benefit analysis with a "pure preference for the current climate regime"; and full cost-benefit analysis. The choice of a given proxy of climate change risks is actually more than a technical option. It is essentially motivated by the degree of distrust regarding the legitimacy of an assessment of climate damages and the possibility of providing reliable and non-controversial estimates in due time. Our results demonstrate that a) for abatement in the early decades, the difference between the various decision-making frameworks matters less than the difference between stochastic and non-stochastic approaches, given the cascade of uncertainty from emissions to damages; b) in a stochastic approach, the possibility of non-catastrophic singularities in the damage function is sufficient to significantly increase earlier optimal abatements; c) a window of opportunity for action exists up to 2040: abatements delayed further may induce significant regret in case of bad news about the climate response or singularities in damages.

    Keywords: Cost-efficiency; Cost-benefit; Climate sensitivity; Climate change damages; Uncertainty; Optimal climate policy; Decision making frameworks
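Result b), that even a non-catastrophic jump in the damage function raises early optimal abatement, can be reproduced in a toy two-period grid search (this is our own illustrative construction, not the RESPONSE model; the functional forms, the uniform sensitivity range, and all coefficients are assumed):

```python
import numpy as np

# Toy setup: choose abatement a in [0, 1]; residual warming is
# T = T0 * (1 - a) * s with an uncertain sensitivity factor s.
rng = np.random.default_rng(0)
s = rng.uniform(0.5, 2.0, 10_000)   # uncertain climate sensitivity (assumed range)
T0 = 4.0                            # unabated warming in degC (assumed)
c_abate = 2.0                       # abatement cost coefficient (assumed)

def expected_cost(a, threshold_jump=0.0):
    T = T0 * (1.0 - a) * s                    # realized warming per draw
    damages = 0.05 * T**2                     # smooth quadratic damages
    damages = damages + threshold_jump * (T > 3.0)  # bounded jump above 3 degC
    return c_abate * a**2 + damages.mean()    # abatement cost + expected damages

grid = np.linspace(0.0, 1.0, 201)
a_smooth = grid[np.argmin([expected_cost(a) for a in grid])]
a_jump = grid[np.argmin([expected_cost(a, threshold_jump=0.5) for a in grid])]
print(a_smooth, a_jump)   # the singularity pushes optimal abatement higher
```

Even though the jump is modest and bounded, reducing the probability of crossing the threshold becomes an extra marginal benefit of abatement, so the optimum shifts to earlier, stronger action.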

    Business Cycles, Bifurcations and Chaos in a Neo-Classical Model with Investment Dynamics

    This paper presents a non-equilibrium dynamic model (NEDyM) that introduces investment dynamics and non-equilibrium effects into a Solow growth model. NEDyM can reproduce several typical economic regimes and, for certain ranges of parameter values, exhibits endogenous business cycles with realistic characteristics. The cycles arise from the investment-profit instability and are constrained by the increase in labor costs and the inertia of production capacity. For other parameter ranges, the model exhibits chaotic behavior. These results show that complex variability in the economic system may be due to deterministic, intrinsic factors, even if the long-term equilibrium is neo-classical in nature.
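The mechanism described, an investment-profit instability bounded by rising labor costs, is reminiscent of the classic Goodwin growth cycle, which can be sketched in a few lines (this is our toy predator-prey illustration with arbitrary coefficients, not NEDyM itself; u is the wage share and v the employment rate):

```python
import numpy as np

# Goodwin-style growth cycle, Euler-integrated: high employment bids
# wages up; a high wage share squeezes profits and hence investment,
# which lowers employment; the cycle then repeats endogenously.
dt, steps = 0.01, 20_000
u, v = 0.5, 0.85                        # initial wage share, employment rate
traj = np.empty((steps, 2))
for t in range(steps):
    du = u * (-0.4 + 0.5 * v)           # real-wage growth rises with employment
    dv = v * (0.5 * (1.0 - u) - 0.3)    # accumulation out of profits, net of trend
    u += dt * du
    v += dt * dv
    traj[t] = u, v

# employment oscillates around its steady state v* = 0.4 / 0.5 = 0.8
crossings = int(np.sum(np.diff(np.sign(traj[:, 1] - 0.8)) != 0))
print(crossings)   # several crossings: sustained endogenous oscillation
```

Like NEDyM's cycles, these fluctuations are purely deterministic: no external shock is needed, only the lagged interaction between distribution and accumulation.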

    Design of Coded Slotted ALOHA with Interference Cancellation Errors

    Coded Slotted ALOHA (CSA) is a random access scheme based on the application of packet erasure correcting codes to transmitted packets and on successive interference cancellation at the receiver. CSA has been widely studied under the common assumption that interference cancellation (IC) can always be applied perfectly. In this paper, we study the design of the CSA protocol while accounting for a non-zero probability of error due to imperfect IC. A classical method to evaluate the performance of such protocols is density evolution, a technique originating in coding theory, which we adapt to our assumptions. By analyzing the convergence of density evolution in asymptotic conditions, we derive the optimal parameters of CSA, i.e., the set of code selection probabilities of users that maximizes the channel load. A new parameter is introduced to model the packet loss rate of the system, which is non-zero due to potential IC errors. Multi-packet reception (MPR) and the performance of 2-MPR are also studied. We investigate the trade-off between optimal load and packet loss rate, which sheds light on new optimal distributions that outperform known ones. Finally, we show that our asymptotic analytical results are consistent with simulations obtained on a finite number of slots.
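A minimal frame-level simulation conveys the flavor of the setting: each user sends two replicas (a CRDSA-like special case of CSA), the receiver iteratively decodes singleton slots and cancels the decoded user's replicas, and each attempted cancellation fails with probability p_ic, leaving a residual interferer (a toy model of our own, not the paper's density-evolution analysis; all parameters are illustrative):

```python
import random

def simulate_frame(n_users, n_slots, p_ic, rng):
    """Return the fraction of users decoded in one CSA frame."""
    # each user places 2 replicas of its packet in distinct random slots
    replicas = [rng.sample(range(n_slots), 2) for _ in range(n_users)]
    slot_load = [set() for _ in range(n_slots)]
    for u, slots in enumerate(replicas):
        for s in slots:
            slot_load[s].add(u)
    decoded = set()
    progress = True
    while progress:                      # iterative SIC over the frame
        progress = False
        for s in range(n_slots):
            if len(slot_load[s]) != 1:
                continue                 # empty slot, or still colliding
            (u,) = slot_load[s]
            if u in decoded:
                continue                 # stuck residual from a failed IC
            decoded.add(u)
            progress = True
            for s2 in replicas[u]:       # try to cancel both replicas
                if rng.random() >= p_ic:
                    slot_load[s2].discard(u)   # IC succeeded
    return len(decoded) / n_users

rng = random.Random(1)
perfect = sum(simulate_frame(30, 100, 0.0, rng) for _ in range(200)) / 200
lossy = sum(simulate_frame(30, 100, 0.2, rng) for _ in range(200)) / 200
print(round(perfect, 3), round(lossy, 3))  # imperfect IC lowers throughput
```

A replica left behind by a failed cancellation can permanently block its slot from ever becoming a singleton, which is why imperfect IC induces a non-zero packet loss floor even at loads the perfect-IC scheme handles easily.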

    Crea.Blender: A Neural Network-Based Image Generation Game to Assess Creativity

    We present a pilot study on crea.blender, a novel co-creative game designed for large-scale, systematic assessment of distinct constructs of human creativity. Co-creative systems are systems in which humans and computers (often with machine learning) collaborate on a creative task. This human-computer collaboration raises questions about the relevance and level of human creativity and involvement in the process. We expand on and explore aspects of these questions in this pilot study. We observe participants play through three different play modes in crea.blender, each aligned with established creativity assessment methods. In these modes, players "blend" existing images into new images under varying constraints. Our study indicates that crea.blender provides a playful experience, affords players a sense of control over the interface, and elicits different types of player behavior, supporting further study of the tool for use in a scalable, playful creativity assessment.

    Comment: 4 pages, 6 figures, CHI Pla